hidden Markov model


A hidden Markov model (HMM) extends the Markov model by adding a hidden state. This can be used to model natural phenomena, such as weather systems that have some longer-term meteorological state as well as the immediately measurable rain or shine. Formally, given a time series or sequence of observable events/tokens, a simple Markov model has a matrix of probabilities M_{jk} giving the probability that event k comes after event j; the hidden Markov model instead has a more complex matrix H_{js;kt}, which gives the probability that, given an observable event j and (unobservable) hidden state s, the next event will be k and the next hidden state t.
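As an illustration of the transition structure just described, the sketch below samples from a small HMM whose matrix H is indexed by (current event, current hidden state) and (next event, next hidden state). The weather labels and the randomly generated probabilities are hypothetical, chosen only to make the example concrete.

```python
import numpy as np

# Hypothetical weather example: observable events j, k index rain/shine;
# hidden states s, t index a longer-term "wet spell" / "dry spell" regime.
events = ["rain", "shine"]
states = ["wet spell", "dry spell"]

# H[j, s, k, t] = P(next event k and next hidden state t
#                   | current event j and current hidden state s)
rng = np.random.default_rng(0)
H = rng.random((2, 2, 2, 2))
H /= H.sum(axis=(2, 3), keepdims=True)   # each (j, s) slice sums to 1

def step(j, s):
    """Sample the next (event, state) pair given the current (event, state)."""
    flat = H[j, s].ravel()               # joint distribution over (k, t)
    idx = rng.choice(len(flat), p=flat)
    return idx // 2, idx % 2             # unravel flat index back to (k, t)

# Simulate a short sequence; only the events would be observable in practice.
j, s = 0, 0
for _ in range(5):
    j, s = step(j, s)
    print(events[j], "|", states[s])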

Inferring an HMM from training sequences is more difficult than training a plain Markov model, as the hidden state is not available in the training data and so must itself be fitted as part of the training process.
Note that higher-order Markov models (those that take into account several past observable events), which are easier to learn, can always be represented as HMMs by simply taking the hidden state to be the past tokens, as sketched below. This can be one way to bootstrap the HMM training process.
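As a sketch of that representation (with an assumed small vocabulary, not an example from the text), the following code embeds a second-order Markov model M2[i, j, k] = P(next token k | token i followed by token j) into the H form above by taking the hidden state to be the previous token; the next hidden state is then forced to equal the current event.

```python
import numpy as np

# Assumed vocabulary size V (hypothetical).
V = 3
rng = np.random.default_rng(1)
M2 = rng.random((V, V, V))
M2 /= M2.sum(axis=2, keepdims=True)      # normalise over the next token k

H = np.zeros((V, V, V, V))               # H[j, s, k, t] as defined above
for i in range(V):                       # hidden state s = previous token i
    for j in range(V):                   # current observable token j
        for k in range(V):               # next observable token k
            H[j, i, k, j] = M2[i, j, k]  # next hidden state is forced to be j

# Each (j, s) slice is still a proper distribution over (k, t).
assert np.allclose(H.sum(axis=(2, 3)), 1.0)
```

Because the hidden state is a deterministic copy of past tokens, this HMM carries no extra information beyond the higher-order model, but it provides a valid starting point from which the hidden state can then be refitted.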

Defined on page 326

Used on pages 305, 326, 327, 328, 331, 338, 340, 410

Also known as HMM